23 research outputs found

    Methods and design issues for next generation network-aware applications

    Networks are becoming an essential component of modern cyberinfrastructure, and this work describes methods for designing distributed applications that exploit high-speed networks to improve scalability, performance and capabilities. As the amount of data generated by scientific applications continues to grow, applications must be designed to use parallel, distributed resources and high-speed networks to handle and process it. For scalable application design, developers should move away from the current component-based approach and instead implement an integrated, non-layered architecture in which applications can use specialized low-level interfaces. The main focus of this research is interactive, collaborative visualization of large datasets. This work describes how a visualization application can be improved by using distributed resources and high-speed network links to interactively visualize tens of gigabytes of data and handle terabyte datasets while maintaining high quality. The application supports interactive frame rates, high resolution and collaborative visualization, and sustains remote I/O bandwidths of several Gbps (up to 30 times faster than local I/O). Motivated by the distributed visualization application, this work also investigates remote data access systems. Because wide-area networks may have high latency, the remote I/O system uses an architecture that effectively hides latency. Five remote data access architectures are analyzed, and the results show that an architecture combining bulk and pipeline processing is the best solution for high-throughput remote data access. The resulting system, which also supports high-speed transport protocols and configurable remote operations, is up to 400 times faster than a comparable existing remote data access system. Transport protocols are compared to determine which can best utilize high-speed network connections, concluding that a rate-based protocol is the best solution, being 8 times faster than standard TCP. An experiment with an HD-based remote teaching application illustrates the potential of network-aware applications in a production environment. Future research areas are presented, with emphasis on network-aware optimization, execution and deployment scenarios.
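
    Although the abstract only summarizes the approach, the bulk + pipeline idea lends itself to a short sketch. The following Python snippet is a hedged illustration, not the dissertation's implementation: it keeps several read requests in flight so that wide-area round-trip latency overlaps with data delivery. fetch_block(), the latency figure, the block size and the pipeline depth are all assumptions for illustration.

        # Minimal sketch of latency hiding via pipelined remote reads.
        import time
        from concurrent.futures import ThreadPoolExecutor

        LATENCY = 0.05          # assumed one-way WAN latency, seconds
        BLOCK_SIZE = 4 * 2**20  # assumed 4 MiB read size
        DEPTH = 8               # requests kept outstanding on the link

        def fetch_block(block_id: int) -> bytes:
            """Hypothetical stand-in for one remote read over the network."""
            time.sleep(2 * LATENCY)   # simulate a full round trip
            return bytes(BLOCK_SIZE)  # placeholder payload

        def read_sequential(block_ids):
            # Naive client: one request at a time, pays full latency per block.
            return (fetch_block(b) for b in block_ids)

        def read_pipelined(block_ids):
            # Pipelined client: DEPTH overlapping requests amortize the latency.
            with ThreadPoolExecutor(max_workers=DEPTH) as pool:
                yield from pool.map(fetch_block, block_ids)

        if __name__ == "__main__":
            for reader in (read_sequential, read_pipelined):
                t0 = time.time()
                n = sum(len(b) for b in reader(range(32)))
                print(f"{reader.__name__}: {n >> 20} MiB in {time.time() - t0:.2f}s")

    With these toy numbers the pipelined reader finishes roughly DEPTH times sooner, which is the effect the latency-hiding architecture relies on.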

    Evaluation of Damages Caused by Floods, Based on Satellite Images. Case Study: Jijia River, Slobozia-Dângeni Sector, July 2010

    This research aimed to identify the areas flooded during the July 2010 floods, using Landsat 7 ETM+ satellite imagery, and to find a more efficient way to extract water bodies. By computing several indices, such as MNDWI, NDWI, NDVI, AWI, WRI and NDMI, it was concluded that, in the present case, the NDWI index was the most effective, the data obtained having very good accuracy. The study area was the Slobozia-Dângeni sector of the Jijia River; the Landsat 7 ETM+ images were acquired on July 3, 2010. The water level at that time at the Dângeni station was 473 cm, decreasing from July 1, 2010, when the level had reached 579 cm. The flooded area obtained was 15.80 km², with the maximum extent of the flooding on July 3, 2010 being approximately 1 km, near the localities of Durneşti and Sapoveni. The study found 143 flooded houses in 19 localities. Of the total flooded area, the largest share is arable land (44.58%), with a surface of 7.04 km².
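
    The NDWI computation this study relies on is simple enough to sketch. The snippet below is a minimal illustration under stated assumptions, not the authors' exact workflow: it computes McFeeters' NDWI = (Green - NIR) / (Green + NIR) from Landsat 7 ETM+ band 2 (green) and band 4 (NIR), then thresholds it at 0 to map water. The file names, the use of the rasterio library for I/O and the threshold value are assumptions for illustration; the 30 m pixel size is standard for ETM+ multispectral bands.

        # Hedged sketch: NDWI water mapping from Landsat 7 ETM+ bands.
        import numpy as np
        import rasterio  # assumed raster I/O library

        def read_band(path: str) -> np.ndarray:
            with rasterio.open(path) as src:
                return src.read(1).astype("float32")

        green = read_band("LE07_B2.TIF")  # hypothetical file names
        nir = read_band("LE07_B4.TIF")

        # NDWI = (Green - NIR) / (Green + NIR); water pixels come out positive.
        ndwi = (green - nir) / np.maximum(green + nir, 1e-6)
        water = ndwi > 0.0  # commonly used NDWI threshold

        # Flooded area in km^2, assuming 30 m Landsat pixels.
        area_km2 = water.sum() * (30 * 30) / 1e6
        print(f"water pixels: {water.sum()}, area: {area_km2:.2f} km2")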

    Remote Partial File Access Using Compact Pattern Descriptions

    We present a method for efficient access to parts of remote files. The efficiency is achieved by using a file-format-independent compact pattern description that allows several parts of a file to be requested in a single operation. This results in a drastically reduced number of remote operations and network latencies compared to common solutions. We measured the time to access parts of remote files with compact patterns, compared it with normal GridFTP remote partial file access, and observed a significant performance increase. Further, we discuss how…
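
    The abstract does not give the pattern encoding, but the idea of replacing many range requests with one compact description can be sketched. Below is a hedged Python illustration with an assumed four-field pattern (offset, length, stride, count); the paper's actual description format may differ.

        # Hedged sketch: a compact pattern expands into many file regions,
        # so one request replaces one round trip per region.
        from dataclasses import dataclass

        @dataclass
        class Pattern:
            offset: int  # start of the first block, bytes
            length: int  # bytes to read per block
            stride: int  # distance between successive block starts, bytes
            count: int   # number of blocks

            def ranges(self):
                """Expand the pattern into explicit (offset, length) pairs."""
                return [(self.offset + i * self.stride, self.length)
                        for i in range(self.count)]

        def read_pattern(path: str, pattern: Pattern) -> bytes:
            """Server side: resolve every region of one request in one pass."""
            parts = []
            with open(path, "rb") as f:
                for off, length in pattern.ranges():
                    f.seek(off)
                    parts.append(f.read(length))
            return b"".join(parts)

        # Example: the 64-byte header of each 512-byte record in a
        # 1000-record file. Naive partial access needs 1000 round trips;
        # the pattern describes all regions in one 4-field message.
        p = Pattern(offset=0, length=64, stride=512, count=1000)
        print(len(p.ranges()), "regions from a single compact pattern")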

    Interactive Exploration of Large Remote Micro-CT Scans

    Datasets of tens of gigabytes are becoming common in computational and experimental science. This development is driven by advances in imaging technology, producing detectors with ever-growing resolutions, as well as by the availability of cheap processing power and memory capacity in commodity-based computing clusters. In this article, we describe the design of a visualization system that allows scientists to interactively explore large remote datasets in an efficient and flexible way. The system is broadly applicable and is currently used by medical scientists conducting an osteoporosis research project. Human vertebral bodies are scanned using a high-resolution micro-CT scanner, producing scans of roughly 8 GB each, and all participating research groups require access to the centrally stored data. Due to the rich internal bone structure, scientists need to interactively explore the full dataset at coarse levels, as well as visualize subvolumes of interest at the highest resolution.
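
    The coarse-overview / full-resolution-subvolume strategy can be made concrete with a small sketch. The following assumes a mip-map style multiresolution volume in which each level halves the resolution along every axis (8x fewer voxels per level); the level numbering, the 16-bit sample size and the transfer budget are assumptions for illustration, not the paper's parameters.

        # Hedged sketch: pick the finest resolution level whose cutout of a
        # region of interest still fits an interactive transfer budget.
        import math

        VOXEL_BYTES = 2  # assumed 16-bit CT samples

        def bytes_at_level(roi_shape, level: int) -> int:
            """Size of the region's cutout when fetched at the given level."""
            return VOXEL_BYTES * math.prod(max(1, s >> level) for s in roi_shape)

        def pick_level(roi_shape, budget_bytes: int, max_level: int = 8) -> int:
            """Finest level (0 = full resolution) that fits the budget."""
            for level in range(max_level + 1):
                if bytes_at_level(roi_shape, level) <= budget_bytes:
                    return level
            return max_level

        # An 8 GB scan only fits a 256 MiB budget at a coarse level...
        print(pick_level((2048, 2048, 1024), 256 * 2**20))  # -> 2
        # ...but a small subvolume of interest streams at full resolution.
        print(pick_level((256, 256, 256), 256 * 2**20))     # -> 0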